Last data update: Apr 29, 2024. (Total: 46658 publications since 2009)
Records 1-23 (of 23 Records) |
Query Trace: Seymour C[original query] |
Dietary Sources of Plasma trans Fatty Acids among Adults in the United States: NHANES 2009-2010
Li C , Richter P , Cobb LK , Kuiper HC , Seymour J , Vesper HW . Curr Dev Nutr 2021 5 (5) nzab063 BACKGROUND: Intake of trans fatty acids (TFAs) increases LDL cholesterol, decreases HDL cholesterol, and increases the risk of heart disease morbidity and mortality. Many food products potentially contain industrially produced or ruminant TFAs. However, little is known about the dietary sources of plasma TFA concentrations. OBJECTIVE: The objective of this study was to examine associations between foods consumed and plasma TFA concentrations using 24-h dietary recall data and plasma TFA measures among adults aged ≥20 y who participated in the NHANES 2009-2010 in the United States. METHODS: Over 4400 food products in the dietary interview data were categorized into 32 food and beverage groups/subgroups. Four major plasma TFAs (palmitelaidic acid, elaidic acid, vaccenic acid, linolelaidic acid) and the sum of the 4 TFAs (sumTFAs) were analyzed using GC-MS. Multivariable linear regression analyses were conducted to identify associations of plasma TFAs with all 32 food and beverage groups/subgroups, controlling for the potential confounding effects of 11 demographic, socioeconomic, behavioral, lifestyle, and health-related variables. RESULTS: Consumption of the following food groups/subgroups was significantly associated with elevated plasma TFA concentrations: cream substitutes (P < 0.001 for palmitelaidic acid, elaidic acid, vaccenic acid, and sumTFAs); cakes, cookies, pastries, and pies (P < 0.001 for elaidic acid, vaccenic acid, and sumTFAs; P < 0.05 for linolelaidic acid); milk and milk desserts (P < 0.01 for palmitelaidic acid and vaccenic acid; P < 0.05 for linolelaidic acid and sumTFAs); beef/veal, lamb/goat, and venison/deer (P < 0.01 for vaccenic acid; P < 0.05 for sumTFAs); and butters (P < 0.001 for palmitelaidic acid and vaccenic acid; P < 0.05 for sumTFAs). 
CONCLUSIONS: The findings suggest that the above 5 food groups/subgroups could be the main dietary sources of plasma TFAs among adults in the United States in 2009-2010. |
"Submergence" of Western equine encephalitis virus: Evidence of positive selection argues against genetic drift and fitness reductions.
Bergren NA , Haller S , Rossi SL , Seymour RL , Huang J , Miller AL , Bowen RA , Hartman DA , Brault AC , Weaver SC . PLoS Pathog 2020 16 (2) e1008102 Understanding the circumstances under which arboviruses emerge is critical for the development of targeted control and prevention strategies. This is highlighted by the emergence of chikungunya and Zika viruses in the New World. However, to comprehensively understand the ways in which viruses emerge and persist, the factors influencing reductions in virus activity must also be understood. Western equine encephalitis virus (WEEV), whose apparent enzootic circulation and equine and human disease incidence declined during the late 20th century, provides a unique case study of how reductions in virus activity can be understood by studying evolutionary trends and mechanisms. Previously, we showed using phylogenetics that during this period of decline, six amino acid residues appeared to be positively selected. To assess the effect of these mutations more directly, we utilized reverse genetics and competition fitness assays in the enzootic host and vector (house sparrows and Culex tarsalis mosquitoes). We observed that the mutations contemporary with reductions in WEEV circulation and disease that were non-conserved with respect to amino acid properties had a positive effect on enzootic fitness. We also assessed the effects of these mutations on virulence in the Syrian golden hamster model in relation to a general trend of increased virulence in older isolates. However, no effect on virulence was attributable to these mutations. Thus, while WEEV apparently underwent positive selection for infection of enzootic hosts, residues associated with mammalian virulence were likely eliminated from the population by genetic drift or negative selection. 
These findings suggest that ecologic factors rather than fitness for natural transmission likely caused decreased levels of enzootic WEEV circulation during the late 20th century. |
Application of the fentanyl analog screening kit toward the identification of emerging synthetic opioids in human plasma and urine by LC-QTOF
Krajewski LC , Swanson KD , Bragg WA , Shaner RL , Seymour C , Carter MD , Hamelin EI , Johnson RC . Toxicol Lett 2020 320 87-94 Human exposures to fentanyl analogs, which significantly contribute to the ongoing U.S. opioid overdose epidemic, can be confirmed through the analysis of clinical samples. Our laboratory has developed and evaluated a qualitative approach coupling liquid chromatography and quadrupole time-of-flight mass spectrometry (LC-QTOF) to address novel fentanyl analogs and related compounds using untargeted, data-dependent acquisition. Compound identification was accomplished by searching against a locally established mass spectral library of 174 fentanyl analogs and metabolites. Currently, our library can identify 150 fentanyl-related compounds from the Fentanyl Analog Screening (FAS) Kit, plus an additional 25 fentanyl-related compounds from individual purchases. Plasma and urine samples fortified with fentanyl-related compounds were assessed to confirm the capabilities and intended use of this LC-QTOF method. For fentanyl, 8 fentanyl-related compounds, and naloxone, lower reportable limits (LRL100), defined as the lowest concentration with a 100% true positive rate (n = 12) within clinical samples, were evaluated and ranged from 0.5 ng/mL to 5.0 ng/mL in urine and from 0.25 ng/mL to 2.5 ng/mL in plasma. The application of this high-resolution mass spectrometry (HRMS) method enables the real-time detection of known and emerging synthetic opioids present in clinical samples. |
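As an aside on the LRL100 metric defined in this record (the lowest spiked concentration detected in 100% of n = 12 replicates), the computation it implies can be sketched in a few lines; the function name and data layout below are hypothetical illustrations, not part of the published method:

```python
def lrl100(detections):
    """Lower reportable limit (LRL100): the lowest spiked concentration
    at which every replicate was detected (100% true-positive rate).

    `detections` maps concentration (ng/mL) -> list of per-replicate
    booleans. Returns None if no level achieves full detection.
    """
    passing = [conc for conc, hits in detections.items() if hits and all(hits)]
    return min(passing) if passing else None

# Example with 12 replicates per level: 0.5 ng/mL is the lowest
# concentration detected in all replicates.
results = {
    0.25: [True] * 10 + [False] * 2,
    0.5: [True] * 12,
    1.0: [True] * 12,
}
print(lrl100(results))  # 0.5
```

Note that a production definition might additionally require all concentrations above the reported limit to pass; the sketch checks each level independently.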
Designing traceable opioid material kits to improve laboratory testing during the U.S. opioid overdose crisis
Mojica MA , Carter MD , Isenberg SL , Pirkle JL , Hamelin EI , Shaner RL , Seymour C , Sheppard CI , Baldwin GT , Johnson RC . Toxicol Lett 2019 317 53-58 In 2017, the U.S. Department of Health and Human Services and the White House declared a public health emergency to address the opioid crisis (Hargan, 2017). On average, 192 Americans died from drug overdoses each day in 2017; 130 (67%) of those died specifically because of opioids (Scholl et al., 2019). Since 2013, there have been significant increases in overdose deaths involving synthetic opioids - particularly those involving illicitly-manufactured fentanyl. The U.S. Drug Enforcement Administration (DEA) estimates that 75% of all opioid identifications are illicit fentanyls (DEA, 2018b). Laboratories are routinely asked to confirm which fentanyl or other opioids are involved in an overdose or encountered by first responders. It is critical to identify and classify the types of drugs involved in an overdose, how often they are involved, and how that involvement may change over time. Health care providers, public health professionals, and law enforcement officers need to know which opioids are in use to treat, monitor, and investigate fatal and non-fatal overdoses. By knowing which drugs are present, appropriate prevention and response activities can be implemented. Laboratory testing is available for clinically used and widely recognized opioids. However, there has been a rapid expansion in new illicit opioids, particularly fentanyl analogs that may not be addressed by current laboratory capabilities. In order to test for these new opioids, laboratories require reference standards for the large number of possible fentanyls. To address this need, the Centers for Disease Control and Prevention (CDC) developed the Traceable Opioid Material§ Kits product line, which provides over 150 opioid reference standards, including over 100 fentanyl analogs. 
These kits were designed to dramatically increase laboratory capability to confirm which opioids are on the streets and causing deaths. The kits are free to U.S.-based laboratories in the public, private, clinical, law enforcement, research, and public health domains. |
Cemented paste backfill geomechanics at a narrow-vein underhand cut-and-fill mine
Raffaldi MJ , Seymour JB , Richardson J , Zahl E , Board M . Rock Mech Rock Eng 2019 52 (12) 4925-4940 Underhand cut-and-fill mining has allowed for the safe extraction of ore in many mines operating in weak rock or highly stressed, rockburst-prone ground conditions. However, the design of safe backfill undercuts is typically based on historical experience at mine operations and on the strength requirements derived from analytical beam equations. In situ measurements in backfill are not commonplace, largely due to challenges associated with instrumenting harsh mining environments. In deep, narrow-vein mines, large deformations and induced stresses fracture the cemented fill, often damaging the instruments and preventing long-term measurements. Hecla Mining Company and the Spokane Mining Research Division of the National Institute for Occupational Safety and Health (NIOSH) have worked collaboratively for several years to better quantify the geomechanics of cemented paste backfill (CPB), thereby improving safety in underhand stopes. A significant focus of this work has been an extensive in situ backfill instrumentation program to monitor long-term stope closure and induced backfill stress. Rugged and durable custom-designed closure meters were developed, allowing measurements to be taken for up to five successive undercuts and measuring closures of more than 50 cm and horizontal fill pressures up to 5.5 MPa. These large stope closures require the stress–strain response of the fill to be considered in design, rather than relying solely on traditional methods of backfill span design based on intact fill strength. Furthermore, long-term instrument response shows a change in behavior after 13–14% strain, indicating a transition from shear yielding of the intact, cemented material to compaction of the pore space between sand grains, typical of uncemented sand fills. 
This strain-hardening behavior is important for mine design purposes, particularly for the use of numerical models to simulate regional rock support and stress redistribution. These quantitative measurements help justify long-standing assumptions regarding the role of backfill in ground support and will be useful for other mines operating under similar conditions. |
Long-term stability of a 13.7 × 30.5-m (45 × 100-ft) undercut span beneath cemented rockfill at the Turquoise Ridge Mine, Nevada
Seymour JB , Martin LA , Raffaldi MJ , Warren SN , Sandbak LA . Rock Mech Rock Eng 2019 2019 1-17 In 2001, researchers from the National Institute for Occupational Safety and Health (NIOSH) installed instruments at the Turquoise Ridge Mine in cooperation with Placer Dome, Inc. to monitor the geomechanical behavior and stability of a cemented rockfill (CRF) sill and the surrounding host rock during test mining of a large undercut span beneath backfill. Six parallel, adjacent drifts were mined and backfilled to construct a CRF sill, approximately 22.9 m (75 ft) wide by 30.5 m (100 ft) long. The sill was then partially undercut, successfully creating a 13.7-m (45-ft) wide by 30.5-m (100-ft) long span beneath the CRF. Only small vertical displacements were measured in the overlying host rock during mining, with most of the movement occurring at shallow depths in the mine roof. Because the back above the CRF sill remained stable, the majority of the mining-induced stress was transferred to the host rock abutments rather than to the backfilled drifts. During retreat mining of the undercut span, the CRF sill and the mine roof remained stable. Most of the measured vertical displacement was caused by separation of the backfill from the overlying host rock, or deflection of the CRF sill, which was comparable to the deflection of a monolithic, elastic plate having similar dimensions, material properties, and undercut spans. The CRF sill moved en masse as a single unit rather than as individual drift segments, and the vertical cold joints between adjacent backfill drifts did not adversely affect their stability. Additional measurements collected from the instruments have shown that the backfill span is still intact and in stable condition more than 16 years after the completion of undercut mining. Displacements in the mine roof and abutments have stabilized, and vertical stress and deformation within the CRF have generally leveled off or decreased. 
Although only slight mining-induced loads were transferred to the backfilled drifts, the CRF has confined the abutment ribs and mine roof, thereby improving their long-term stability. Results of compressive and tensile strength tests conducted with CRF samples from the test site indicate that the long-term compressive strength gain for CRF is similar to that of concrete, and that the tensile-to-compressive strength ratio for CRF is about 1/6 rather than 1/10. Assuming the in-place CRF gained strength at the same rate as the lab samples, an analysis of the flexural stability of the CRF undercut span shows that the Factor of Safety for the span should logically have increased over time. By providing a better understanding of the long-term strength properties and geomechanical behavior of CRF, these research findings help improve the methods that are used for designing stable, long-term undercut entries beneath cemented backfill. |
Sepsis surveillance using Adult Sepsis Events simplified eSOFA criteria versus Sepsis-3 Sequential Organ Failure Assessment criteria
Rhee C , Zhang Z , Kadri SS , Murphy DJ , Martin GS , Overton E , Seymour CW , Angus DC , Dantes R , Epstein L , Fram D , Schaaf R , Wang R , Klompas M . Crit Care Med 2019 47 (3) 307-314 OBJECTIVES: Sepsis-3 defines organ dysfunction as an increase in the Sequential Organ Failure Assessment score by greater than or equal to 2 points. However, some Sequential Organ Failure Assessment score components are not routinely recorded in all hospitals' electronic health record systems, limiting its utility for wide-scale sepsis surveillance. The Centers for Disease Control and Prevention recently released the Adult Sepsis Event surveillance definition that includes simplified organ dysfunction criteria optimized for electronic health records (eSOFA). We compared eSOFA versus Sequential Organ Failure Assessment with regard to sepsis prevalence, overlap, and outcomes. DESIGN: Retrospective cohort study. SETTING: One hundred eleven U.S. hospitals in the Cerner HealthFacts dataset. PATIENTS: Adults hospitalized in 2013-2015. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: We identified clinical indicators of presumed infection (blood cultures and antibiotics) concurrent with either: 1) an increase in Sequential Organ Failure Assessment score by 2 or more points (Sepsis-3) or 2) 1 or more eSOFA criteria: vasopressor initiation, mechanical ventilation initiation, lactate greater than or equal to 2.0 mmol/L, doubling in creatinine, doubling in bilirubin to greater than or equal to 2.0 mg/dL, or greater than or equal to 50% decrease in platelet count to less than 100 cells/μL (Centers for Disease Control and Prevention Adult Sepsis Event). We compared area under the receiver operating characteristic curves for discriminating in-hospital mortality, adjusting for baseline characteristics. Of 942,360 patients in the cohort, 57,242 (6.1%) had sepsis by Sequential Organ Failure Assessment versus 41,618 (4.4%) by eSOFA. 
Agreement between sepsis by Sequential Organ Failure Assessment and eSOFA was good (Cronbach's alpha 0.81). Baseline characteristics and infectious diagnoses were similar, but mortality was higher with eSOFA (17.1%) versus Sequential Organ Failure Assessment (14.4%; p < 0.001) as was discrimination for mortality (area under the receiver operating characteristic curve, 0.774 vs 0.759; p < 0.001). Comparisons were consistent across subgroups of age, infectious diagnoses, and comorbidities. CONCLUSIONS: The Adult Sepsis Event's eSOFA organ dysfunction criteria identify a smaller, more severely ill sepsis cohort compared with the Sequential Organ Failure Assessment score, but with good overlap and similar clinical characteristics. Adult Sepsis Events may facilitate wide-scale automated sepsis surveillance that tracks closely with the more complex Sepsis-3 criteria. |
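The simplified eSOFA organ-dysfunction criteria enumerated in this record amount to a straightforward decision rule; a minimal sketch follows, with hypothetical field names for baseline and current values (an illustration of the criteria as listed in the abstract, not the CDC's reference implementation, which also specifies baseline windows and eligibility rules):

```python
def meets_esofa(baseline, current):
    """Return True if any eSOFA organ-dysfunction criterion from the
    CDC Adult Sepsis Event definition (as summarized above) is met.
    `baseline` and `current` are dicts of illustrative field names."""
    return any([
        current.get("vasopressor_initiated", False),
        current.get("mech_ventilation_initiated", False),
        current.get("lactate_mmol_L", 0.0) >= 2.0,
        # doubling in creatinine relative to baseline
        current.get("creatinine_mg_dL", 0.0)
            >= 2 * baseline.get("creatinine_mg_dL", float("inf")),
        # doubling in bilirubin, to >= 2.0 mg/dL
        (current.get("bilirubin_mg_dL", 0.0)
            >= 2 * baseline.get("bilirubin_mg_dL", float("inf"))
         and current.get("bilirubin_mg_dL", 0.0) >= 2.0),
        # >= 50% decrease in platelets to < 100 cells/uL
        # (assumes an adequate baseline count of >= 100)
        (baseline.get("platelets", 0) >= 100
         and current.get("platelets", float("inf"))
             <= 0.5 * baseline.get("platelets", 0)
         and current.get("platelets", float("inf")) < 100),
    ])

# Example: an elevated lactate alone satisfies the criteria
print(meets_esofa({"creatinine_mg_dL": 1.0}, {"lactate_mmol_L": 2.4}))  # True
```

Each criterion is an objective, routinely recorded quantity, which is the point of the surveillance definition: no Glasgow Coma Scale or PaO2/FiO2 components that many electronic health records capture inconsistently.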
Determination of fentanyl analog exposure using dried blood spots with LC-MS-MS
Seymour C , Shaner RL , Feyereisen MC , Wharton RE , Kaplan P , Hamelin EI , Johnson RC . J Anal Toxicol 2018 43 (4) 266-276 Fentanyl and the numerous drugs derived from it are contributing to the opioid overdose epidemic currently underway in the USA. To identify human exposure to these growing public health threats, an LC-MS-MS method for 5 μL dried blood spots (DBS) was developed. This method was developed to detect exposure to 3-methylfentanyl, alfentanil, alpha-methylfentanyl, carfentanil, fentanyl, lofentanil, sufentanil, norcarfentanil, norfentanyl, norlofentanil, and norsufentanil, plus (in a separate LC-MS-MS injection) cyclopropylfentanyl, acrylfentanyl, 2-furanylfentanyl, isobutyrylfentanyl, ocfentanil, and methoxyacetylfentanyl. Preparation of materials into groups of compounds was used to accommodate an ever-increasing need to incorporate newly identified fentanyls. This protocol was validated within a linear range of 1.00-100 ng/mL, with precision ≤12% CV and accuracy ≥93%, as reported for the pooled blood QC samples, and limits of detection as low as 0.10 ng/mL. The use of DBS to assess fentanyl analog exposures can facilitate rapid sample collection, transport, and preparation for analysis that could enhance surveillance and response efforts in the ongoing opioid overdose epidemic. |
Variation in identifying sepsis and organ dysfunction using administrative versus electronic clinical data and impact on hospital outcome comparisons
Rhee C , Jentzsch MS , Kadri SS , Seymour CW , Angus DC , Murphy DJ , Martin GS , Dantes RB , Epstein L , Fiore AE , Jernigan JA , Danner RL , Warren DK , Septimus EJ , Hickok J , Poland RE , Jin R , Fram D , Schaaf R , Wang R , Klompas M . Crit Care Med 2018 47 (4) 493-500 OBJECTIVES: Administrative claims data are commonly used for sepsis surveillance, research, and quality improvement. However, variations in diagnosis, documentation, and coding practices for sepsis and organ dysfunction may confound efforts to estimate sepsis rates, compare outcomes, and perform risk adjustment. We evaluated hospital variation in the sensitivity of claims data relative to clinical data from electronic health records and its impact on outcome comparisons. DESIGN, SETTING, AND PATIENTS: Retrospective cohort study of 4.3 million adult encounters at 193 U.S. hospitals in 2013-2014. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Sepsis was defined using electronic health record-derived clinical indicators of presumed infection (blood culture draws and antibiotic administrations) and concurrent organ dysfunction (vasopressors, mechanical ventilation, doubling in creatinine, doubling in bilirubin to ≥2.0 mg/dL, decrease in platelets to <100 cells/μL, or lactate ≥2.0 mmol/L). We compared sepsis prevalence and mortality rates between claims and clinical data. All estimates were reliability adjusted to account for random variation using hierarchical logistic regression modeling. The sensitivity of hospitals' claims data was low and variable: median 30% (range, 5-54%) for sepsis, 66% (range, 26-84%) for acute kidney injury, 39% (range, 16-60%) for thrombocytopenia, 36% (range, 29-44%) for hepatic injury, and 66% (range, 29-84%) for shock. Correlation between claims and clinical data was moderate for sepsis prevalence (Pearson coefficient, 0.64) and mortality (0.61). 
Among hospitals in the lowest sepsis mortality quartile by claims, 46% shifted to higher mortality quartiles using clinical data. Using implicit sepsis criteria based on infection and organ dysfunction codes also yielded major differences versus clinical data. CONCLUSIONS: Variation in the accuracy of claims data for identifying sepsis and organ dysfunction limits their use for comparing hospitals' sepsis rates and outcomes. Using objective clinical data may facilitate more meaningful hospital comparisons. |
Quantification of microcystin-LR in human urine by immunocapture liquid chromatography tandem mass spectrometry
Wharton RE , Ojeda-Torres G , Cunningham B , Feyereisen MC , Hill KL , Abbott NL , Seymour C , Hill D , Lang J , Hamelin EI , Johnson RC . Chem Res Toxicol 2018 31 (9) 898-903 Microcystins are toxins produced by many cyanobacteria species, which are often released into waterways during blue-green algal blooms in freshwater and marine habitats. The consumption of microcystin-contaminated water is a public health concern as these toxins are recognized tumor promoters and are hepatotoxic to humans and animals. A method to confirm human exposures to microcystins is needed; therefore, our laboratory has developed an immunocapture liquid chromatography tandem mass spectrometry (LC-MS/MS) method targeting the conserved Adda portion of microcystins for the quantitation of a prevalent and highly toxic congener of microcystin, microcystin-LR (MC-LR). An acute exposure method was initially evaluated for accuracy and precision by analyzing calibrators and quality control (QC) samples ranging from 0.500 to 75.0 ng/mL in urine. All calibrators and QC samples characterized were within 15% of theoretical concentrations. An analysis of acutely exposed mouse urine samples using this method identified MC-LR levels from 10.7 to 33.9 ng/mL. Since human exposures are anticipated to result from low-dose or chronic exposures, a high-sensitivity method was validated with 20 calibration curves and QC samples ranging from 0.0100 to 7.50 ng/mL. Relative standard deviations (RSDs) and inaccuracies of these samples were within 15%, meeting United States Food and Drug Administration (FDA) guidelines for analytical methods, and the limit of detection was 0.00455 ng/mL. In conclusion, we have developed a method which can be used to address public health concerns by precisely and accurately measuring MC-LR in urine samples. |
Investigation of dried blood sampling with liquid chromatography tandem mass spectrometry to confirm human exposure to nerve agents
Shaner RL , Coleman RM , Schulze N , Platanitis K , Brown AA , Seymour C , Kaplan P , Perez J , Hamelin EI , Johnson RC . Anal Chim Acta 2018 1033 100-107 A method was developed to detect and quantify organophosphate nerve agent (OPNA) metabolites in dried blood samples. Dried blood spots (DBS) and microsampling devices are alternatives to traditional blood draws, allowing for safe handling, extended stability, reduced shipping costs, and potential self-sampling. DBS and microsamplers were evaluated for precision, accuracy, sensitivity, matrix effects, and extraction recovery following collection of whole blood containing five OPNA metabolites. The metabolites of VX, Sarin (GB), Soman (GD), Cyclosarin (GF), and Russian VX (VR) were quantitated from 5.0 to 500 ng mL⁻¹ with precision of ≤16% and accuracy between 93 and 108% for QC samples with controlled volumes. For unknown spot volumes, OPNA metabolite concentrations were normalized to total blood protein to improve interpretation of nerve agent exposures. This study provides data to support the use of DBS and microsamplers to collect critical exposure samples quickly, safely, and efficiently following large-scale chemical exposure events. |
Incidence and trends of sepsis in US hospitals using clinical vs claims data, 2009-2014
Rhee C , Dantes R , Epstein L , Murphy DJ , Seymour CW , Iwashyna TJ , Kadri SS , Angus DC , Danner RL , Fiore AE , Jernigan JA , Martin GS , Septimus E , Warren DK , Karcz A , Chan C , Menchaca JT , Wang R , Gruber S , Klompas M . JAMA 2017 318 (13) 1241-1249 Importance: Estimates from claims-based analyses suggest that the incidence of sepsis is increasing and mortality rates from sepsis are decreasing. However, estimates from claims data may lack clinical fidelity and can be affected by changing diagnosis and coding practices over time. Objective: To estimate the US national incidence of sepsis and trends using detailed clinical data from the electronic health record (EHR) systems of diverse hospitals. Design, Setting, and Population: Retrospective cohort study of adult patients admitted to 409 academic, community, and federal hospitals from 2009-2014. Exposures: Sepsis was identified using clinical indicators of presumed infection and concurrent acute organ dysfunction, adapting Third International Consensus Definitions for Sepsis and Septic Shock (Sepsis-3) criteria for objective and consistent EHR-based surveillance. Main Outcomes and Measures: Sepsis incidence, outcomes, and trends from 2009-2014 were calculated using regression models and compared with claims-based estimates using International Classification of Diseases, Ninth Revision, Clinical Modification codes for severe sepsis or septic shock. Case-finding criteria were validated against Sepsis-3 criteria using medical record reviews. Results: A total of 173,690 sepsis cases (mean age, 66.5 [SD, 15.5] y; 77,660 [42.4%] women) were identified using clinical criteria among 2,901,019 adults admitted to study hospitals in 2014 (6.0% incidence). Of these, 26,061 (15.0%) died in the hospital and 10,731 (6.2%) were discharged to hospice. 
From 2009-2014, sepsis incidence using clinical criteria was stable (+0.6% relative change/y [95% CI, -2.3% to 3.5%], P = .67) whereas incidence per claims increased (+10.3%/y [95% CI, 7.2% to 13.3%], P < .001). In-hospital mortality using clinical criteria declined (-3.3%/y [95% CI, -5.6% to -1.0%], P = .004), but there was no significant change in the combined outcome of death or discharge to hospice (-1.3%/y [95% CI, -3.2% to 0.6%], P = .19). In contrast, mortality using claims declined significantly (-7.0%/y [95% CI, -8.8% to -5.2%], P < .001), as did death or discharge to hospice (-4.5%/y [95% CI, -6.1% to -2.8%], P < .001). Clinical criteria were more sensitive in identifying sepsis than claims (69.7% [95% CI, 52.9% to 92.0%] vs 32.3% [95% CI, 24.4% to 43.0%], P < .001), with comparable positive predictive value (70.4% [95% CI, 64.0% to 76.8%] vs 75.2% [95% CI, 69.8% to 80.6%], P = .23). Conclusions and Relevance: In clinical data from 409 hospitals, sepsis was present in 6% of adult hospitalizations, and in contrast to claims-based analyses, neither the incidence of sepsis nor the combined outcome of death or discharge to hospice changed significantly between 2009-2014. The findings also suggest that EHR-based clinical data provide more objective estimates than claims-based data for sepsis surveillance. |
Quantitation of fentanyl analogs in dried blood spots by flow-through desorption coupled to online solid phase extraction tandem mass spectrometry
Shaner RL , Schulze ND , Seymour C , Hamelin EI , Thomas JD , Johnson RC . Anal Methods 2017 9 (25) 3876-3883 An automated dried blood spot (DBS) elution coupled with solid phase extraction and tandem mass spectrometric analysis for multiple fentanyl analogs was developed and assessed. This method confirms human exposures to fentanyl, sufentanil, carfentanil, alfentanil, lofentanil, α-methyl fentanyl, and 3-methyl fentanyl in blood with minimal sample volume and reduced shipping and storage costs. Seven fentanyl analogs were detected and quantitated from DBS made from venous blood. The calibration curve in matrix was linear in the concentration range of 1.0 ng mL⁻¹ to 100 ng mL⁻¹ with a correlation coefficient greater than 0.98 for all compounds. The limit of detection varied from 0.15 ng mL⁻¹ to 0.66 ng mL⁻¹ depending on target analyte. Analysis of the entire DBS minimized the effects of hematocrit on quantitation. All quality control materials evaluated resulted in <15% error; analytes with isotopically labeled internal standards had <15% RSD, while analytes without matching standards had 15-24% RSD. This method provides an automated means to detect seven fentanyl analogs, and quantitate four fentanyl analogs with the benefits of DBS at levels anticipated from an overdose of these potent opioids. © 2017 The Royal Society of Chemistry. |
Jackleg drill injuries
Clark CC , Benton DJ , Seymour JB , Martin LA . Min Eng 2016 68 (8) 57-62 The U.S. National Institute for Occupational Safety and Health (NIOSH) is conducting research on jackleg use and related accidents in underground metal mines. This paper provides an analysis and overview of jackleg drill usage, accidents, operational characteristics and alternatives, based on information from injury reports, legacy research, stakeholder input and published literature. The results indicate that jackleg drills are involved in more groundfall accidents in underground metal mines than any other drill, and jackleg-drill-related injuries are most prevalent at the face in the course of installing initial ground support. Practical mechanized alternatives to jackleg drills for drilling and bolting under incomplete support in narrow underground openings have not yet been realized. Small, versatile mechanized bolting equipment needs to be developed to address jackleg-drill-related accidents and improve safety at mines where jackleg drills are being used. |
The burden of influenza-associated critical illness hospitalizations
Ortiz JR , Neuzil KM , Shay DK , Rue TC , Neradilek MB , Zhou H , Seymour CW , Hooper LG , Cheng PY , Goss CH , Cooke CR . Crit Care Med 2014 42 (11) 2325-32 OBJECTIVE: Influenza is the most common vaccine-preventable disease in the United States; however, little is known about the burden of critical illness due to influenza virus infection. Our primary objective was to estimate the proportion of all critical illness hospitalizations that are attributable to seasonal influenza. DESIGN: Retrospective cohort study. SETTING: Arizona, California, and Washington from January 2003 to March 2009. PATIENTS: All adults hospitalized with critical illness, defined by International Classification of Diseases, 9th Revision, Clinical Modification diagnosis and procedure codes for acute respiratory failure, severe sepsis, or in-hospital death. MEASUREMENTS AND MAIN RESULTS: We combined the complete hospitalization discharge databases for three U.S. states, regional influenza virus surveillance, and state census data. Using negative binomial regression models, we estimated the incidence rates of adult influenza-associated critical illness hospitalizations and compared them with all-cause event rates. We also compared modeled outcomes to International Classification of Diseases, 9th Revision, Clinical Modification-coded influenza hospitalizations to assess potential underrecognition of severe influenza disease. During the study period, we estimated that 26,760 influenza-associated critical illness hospitalizations (95% CI, 14,541, 47,464) occurred. The population-based incidence estimate for influenza-associated critical illness was 12.0 per 100,000 person-years (95% CI, 6.6, 21.6) or 1.3% of all critical illness hospitalizations (95% CI, 0.7%, 2.3%). During the influenza season, 3.4% of all critical illness hospitalizations (95% CI, 1.9%, 5.8%) were attributable to influenza. 
There were only 2,612 critical illness hospitalizations with International Classification of Diseases, 9th Revision, Clinical Modification-coded influenza diagnoses, suggesting influenza is either undiagnosed or undercoded in a substantial proportion of critical illness. CONCLUSIONS: Extrapolating our data to the 2010 U.S. population, we estimate that about 28,000 adults are hospitalized for influenza-associated critical illness annually. Influenza in many of these critically ill patients may be undiagnosed. Critical care physicians should have a high index of suspicion for influenza in the ICU, particularly when influenza is known to be circulating in their communities. |
High prevalence and two dominant host-specific genotypes of Coxiella burnetii in U.S. milk.
Pearson T , Hornstra HM , Hilsabeck R , Gates LT , Olivas SM , Birdsell DM , Hall CM , German S , Cook JM , Seymour ML , Priestley RA , Kondas AV , Clark Friedman CL , Price EP , Schupp JM , Liu CM , Price LB , Massung RF , Kersh GJ , Keim P . BMC Microbiol 2014 14 (1) 41 BACKGROUND: Coxiella burnetii causes Q fever in humans and Coxiellosis in animals; symptoms range from general malaise to fever, pneumonia, endocarditis and death. Livestock are a significant source of human infection as they shed C. burnetii cells in birth tissues, milk, urine and feces. Although prevalence of C. burnetii is high, few Q fever cases are reported in the U.S. and we have a limited understanding of their connectedness due to difficulties in genotyping. Here, we develop canonical SNP genotyping assays to evaluate spatial and temporal relationships among C. burnetii environmental samples and compare them across studies. Given the genotypic diversity of historical collections, we hypothesized that the current enzootic of Coxiellosis is caused by multiple circulating genotypes. We collected A) 23 milk samples from a single bovine herd, B) 134 commercial bovine and caprine milk samples from across the U.S., and C) 400 bovine and caprine samples from six milk processing plants over three years. RESULTS: We detected C. burnetii DNA in 96% of samples with no variance over time. We genotyped 88.5% of positive samples; bovine milk contained only a single genotype (ST20) and caprine milk was dominated by a second type (mostly ST8). CONCLUSIONS: The high prevalence and lack of genotypic diversity is consistent with a model of rapid spread and persistence. The segregation of genotypes between host species is indicative of species-specific adaptations or dissemination barriers and may offer insights into the relative lack of human cases and help characterize circulating genotypes.
Let's move salad bars to schools: a public-private partnership to increase student fruit and vegetable consumption
Harris DM , Seymour J , Grummer-Strawn L , Cooper A , Collins B , DiSogra L , Marshall A , Evans N . Child Obes 2012 8 (4) 294-7 Few school-age youth consume the recommended amounts of fruits and vegetables, and increasing fruit and vegetable intake in children and adolescents is an important public health goal to maintain long-term good health and to decrease risk of chronic disease and obesity. School salad bars are an important tool to promote fruit and vegetable consumption among schoolchildren. Studies show that introduction of school salad bars increases the amount and variety of fruits and vegetables consumed by children in schools. However, many schools cannot afford the capital investment in the salad bar equipment. In 2010, the National Fruit & Vegetable Alliance (NFVA), United Fresh Produce Association Foundation, the Food Family Farming Foundation, and Whole Foods Market launched Let's Move Salad Bars to Schools (LMSB2S) in support of First Lady Michelle Obama's Let's Move! initiative. The goal of LMSB2S is to place 6000 salad bars in schools over 3 years. As of June 2012, over 1400 new salad bar units have been delivered to schools across the United States, increasing access to fruits and vegetables for over 700,000 students. Any K through 12 school district participating in the National School Lunch Program is eligible to submit an application at www.saladbars2schools.org/. Requests for salad bar units ($2625 each unit) are fulfilled through grassroots fundraising in the school community and through funds raised by the LMSB2S partners from corporate and foundation sources. LMSB2S is a model for coalition-building across many government, nonprofit, and industry partners to address a major public health challenge.
Development and validation of Burkholderia pseudomallei-specific real-time PCR assays for clinical, environmental or forensic detection applications
Price EP , Dale JL , Cook JM , Sarovich DS , Seymour ML , Ginther JL , Kaufman EL , Beckstrom-Sternberg SM , Mayo M , Kaestli M , Glass MB , Gee JE , Wuthiekanun V , Warner JM , Baker A , Foster JT , Tan P , Tuanyok A , Limmathurotsakul D , Peacock SJ , Currie BJ , Wagner DM , Keim P , Pearson T . PLoS One 2012 7 (5) e37723 The bacterium Burkholderia pseudomallei causes melioidosis, a rare but serious illness that can be fatal if untreated or misdiagnosed. Species-specific PCR assays provide a technically simple method for differentiating B. pseudomallei from near-neighbor species. However, substantial genetic diversity and high levels of recombination within this species reduce the likelihood that molecular signatures will differentiate all B. pseudomallei from other Burkholderiaceae. Currently available molecular assays for B. pseudomallei detection lack rigorous validation across large in silico datasets and isolate collections to test for specificity, and none have been subjected to stringent quality control criteria (accuracy, precision, selectivity, limit of quantitation (LoQ), limit of detection (LoD), linearity, ruggedness and robustness) to determine their suitability for environmental, clinical or forensic investigations. In this study, we developed two novel B. pseudomallei specific assays, 122018 and 266152, using a dual-probe approach to differentiate B. pseudomallei from B. thailandensis, B. oklahomensis and B. thailandensis-like species; other species failed to amplify. Species specificity was validated across a large DNA panel (>2,300 samples) comprising Burkholderia spp. and non-Burkholderia bacterial and fungal species of clinical and environmental relevance. Comparison of assay specificity to two previously published B. pseudomallei-specific assays, BurkDiff and TTS1, demonstrated comparable performance of all assays, providing between 99.7 and 100% specificity against our isolate panel. 
Last, we subjected 122018 and 266152 to rigorous quality control analyses, thus providing quantitative limits of assay performance. Using B. pseudomallei as a model, our study provides a framework for comprehensive quantitative validation of molecular assays and provides additional, highly validated B. pseudomallei assays for the scientific research community.
Novel chikungunya vaccine candidate with an IRES-based attenuation and host range alteration mechanism
Plante K , Wang E , Partidos CD , Weger J , Gorchakov R , Tsetsarkin K , Borland EM , Powers AM , Seymour R , Stinchcomb DT , Osorio JE , Frolov I , Weaver SC . PLoS Pathog 2011 7 (7) e1002142 Chikungunya virus (CHIKV) is a reemerging mosquito-borne pathogen that has recently caused devastating urban epidemics of severe and sometimes chronic arthralgia. As with most other mosquito-borne viral diseases, control relies on reducing mosquito populations and their contact with people, which has been ineffective in most locations. Therefore, vaccines remain the best strategy to prevent most vector-borne diseases. Ideally, vaccines for diseases of resource-limited countries should combine low cost and single dose efficacy, yet induce rapid and long-lived immunity with negligible risk of serious adverse reactions. To develop such a vaccine to protect against chikungunya fever, we employed a rational attenuation mechanism that also prevents the infection of mosquito vectors. The internal ribosome entry site (IRES) from encephalomyocarditis virus replaced the subgenomic promoter in a cDNA CHIKV clone, thus altering the levels and host-specific mechanism of structural protein gene expression. Testing in both normal outbred and interferon response-defective mice indicated that the new vaccine candidate is highly attenuated, immunogenic and efficacious after a single dose. Furthermore, it is incapable of replicating in mosquito cells or infecting mosquitoes in vivo. This IRES-based attenuation platform technology may be useful for the predictable attenuation of any alphavirus.
Probing the attenuation and protective efficacy of a candidate chikungunya virus vaccine in mice with compromised interferon (IFN) signaling
Partidos CD , Weger J , Brewoo J , Seymour R , Borland EM , Ledermann JP , Powers AM , Weaver SC , Stinchcomb DT , Osorio JE . Vaccine 2011 29 (16) 3067-73 Chikungunya virus (CHIKV) is a mosquito-borne alphavirus that causes explosive outbreaks of febrile illness associated with rash and painful arthralgia. The CHIK vaccine strain 181/clone25 (181/25) developed by the United States Army Medical Research Institute of Infectious Diseases (USAMRIID) was shown to be well-tolerated and highly immunogenic in phase I and II clinical trials although it induced transient arthralgia in some healthy adult volunteers. In an attempt to better understand the host factors that are involved in the attenuating phenotype of CHIK 181/25 vaccine virus we conducted studies in interferon (IFN)-compromised mice and also evaluated its immunogenic potential and protective capacity. Infection of AG129 mice (defective in IFN-alpha/beta and IFN-gamma receptor signaling) with CHIK 181/25 resulted in rapid mortality within 3-4 days. In contrast, all infected A129 mice (defective in IFN-alpha/beta receptor signaling) survived with temporary morbidity characterized by ruffled appearance and body weight loss. A129 heterozygote mice that retain partial IFN-alpha/beta receptor signaling activity remained healthy. Infection of A129 mice with CHIK 181/25 induced significant levels of IFN-gamma and IL-12, while the inflammatory cytokines TNF-alpha and IL-6 remained low. A single administration of the CHIK 181/25 vaccine provided both short-term and long-term protection (38 days and 247 days post-prime, respectively) against challenge with wt CHIKV-La Reunion (CHIKV-LR). This protection was at least partially mediated by antibodies since passively transferred immune serum protected both A129 and AG129 mice from wt CHIKV-LR and 181/25 virus challenge.
Overall, these data highlight the importance of IFNs in controlling CHIK 181/25 vaccine and demonstrate the ability of this vaccine to elicit neutralizing antibody responses that confer short- and long-term protection against wt CHIKV-LR challenge.
What works? Process evaluation of a school-based fruit and vegetable distribution program in Mississippi
Potter SC , Schneider D , Coyle KK , May G , Robin L , Seymour J . J Sch Health 2011 81 (4) 202-211 BACKGROUND: During the 2004-2005 school year, the Mississippi Department of Education, Office of Child Nutrition, initiated a pilot program to distribute free fruit and vegetable snacks to students during the school day. This article describes the first-year implementation of the Mississippi Fruit and Vegetable Pilot Program. METHODS: The process evaluation addressed where, when, and how produce was distributed; what was distributed; challenges and successes; and recommended modifications. Five of the 25 program schools were selected to participate in the evaluation; selection was based on grade levels served and demographic characteristics. Data were collected from program staff (N = 11) and administrators (N = 6) via interviews and logs; student (N = 42) and parent (N = 19) focus groups; student questionnaires (N = 660); and school staff questionnaires (N = 207). RESULTS: Distributing fresh fruit and vegetable snacks at school was well received by staff and students. Most schools distributed the fresh fruit and vegetable snacks at morning break in classrooms or a central courtyard. Twenty-two types of fresh fruit, 4 types of dried fruit, and 7 types of vegetables were served to students during the program year. Commonly distributed fruit included apples, oranges, pears, bananas, and tangerines. Carrots were the staple vegetable, followed by celery. Key challenges included getting students to try new foods and receiving the produce in a timely manner without spoiling. Main successes included seeing students try new fruit and vegetable snacks, having the program run smoothly, and teacher support. CONCLUSIONS: The program fit well within the school structure and could be an effective component of a multifaceted approach to enhancing child nutrition.
Long-term stability of a backfilled room-and-pillar test section at the Buick Mine, Missouri, USA
Tesarik DR , Seymour JB , Yanske TR . Int J Rock Mech Min Sci 2009 46 (7) 1182-1196 Rock mechanics instruments have been providing data in a backfilled room-and-pillar test section of the Buick Mine near Boss, Missouri, USA, for nearly 16 years. Host rock instruments include borehole extensometers installed in the mine roof and pillars, and biaxial stressmeters used in pillars and abutments. Embedment strain gauges, extensometers, and earth pressure cells were installed in the cemented backfill. The instruments monitored stability of the test section for two years while the pillars were extracted, and 14 years after pillar extraction to monitor long-term stability. Of the transducers that were not mined out when the pillars were extracted, 68% still function. Data from these instruments demonstrate that backfill improves long-term underground safety by supporting the mine roof and maintaining the strength of support pillars. For example, backfill significantly limited the dilation of a remaining support pillar by providing confinement on one side of the pillar. Post-mining stress and strain in the backfill account for 35% and 28% of the total stress and strain that was measured, respectively. Earth pressure cell stress measurements confirmed visual observations that the backfill remained stable. Post-mining stress measurements from the earth pressure cells fit natural log equations as a function of time with r-squared values ranging from 0.76 to 0.98. Natural log equations also described post-mining strain behavior of the backfill with r-squared values ranging from 0.30 to 0.99. Stresses calculated for the backfill by a three-dimensional numerical model of the test area were consistent with those that were measured by earth pressure cells.
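The natural-log fits described in the abstract (stress or strain as a function of time) amount to ordinary least squares on log-transformed time, y = a + b·ln(t). A minimal sketch of that fit, with the r-squared statistic the authors report, might look like the following; this is an illustrative helper written for this listing, not the authors' code, and the function name and interface are hypothetical.

```python
import math

def fit_natural_log(times, values):
    """Least-squares fit of values = a + b*ln(t).

    Returns (a, b, r_squared). `times` must be positive.
    """
    x = [math.log(t) for t in times]
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(values) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, values))
    b = sxy / sxx                     # slope on ln(t)
    a = mean_y - b * mean_x           # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, values))
    ss_tot = sum((yi - mean_y) ** 2 for yi in values)
    return a, b, 1.0 - ss_res / ss_tot
```

Applied to the earth-pressure-cell time series, r-squared values near 1 (as in the reported 0.76-0.98 range for stress) would indicate that the logarithmic model captures the post-mining consolidation trend well.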
Distributing free fresh fruit and vegetables at school: results of a pilot outcome evaluation
Coyle KK , Potter S , Schneider D , May G , Robin LE , Seymour J , Debrot K . Public Health Rep 2009 124 (5) 660-9 OBJECTIVES: Consumption of fruit and vegetables among children is generally below recommended levels. This evaluation addressed two questions: (1) To what extent did children's attitudes toward, familiarity with, and preferences for fruit and vegetables change during the school year? and (2) To what extent did children's consumption of fruit and vegetables change during the school year? METHODS: During the 2004-2005 school year, the Mississippi Department of Education, Child Nutrition Programs initiated a pilot program to distribute free fruit and vegetables to students (kindergarten through 12th grade) during the school day. Data were collected in 2004-2005 within a one-group pretest/posttest design using a self-report questionnaire (n=725) and 24-hour dietary recalls (n=207) with a sample of students from five schools in Mississippi. Data were analyzed in 2006-2007. RESULTS: Results showed greater familiarity with fruit and vegetables at all grade levels (p<0.05) and increased preferences for fruit among eighth- and 10th-grade students (p<0.01). Eighth-grade students also reported more positive attitudes toward eating fruit and vegetables (p<0.01), increased perceived self-efficacy to eat more fruit (p<0.01), and increased willingness to try new fruit. Finally, results showed increased consumption of fruit, but not vegetables, among eighth- and 10th-grade students (p<0.001). CONCLUSIONS: Distributing free fruit and vegetables at school may be a viable component of a more comprehensive approach for improving students' nutrition attitudes and behaviors. More program emphasis is needed on ways to promote vegetable consumption.
- Page last reviewed: Feb 1, 2024
- Page last updated: Apr 29, 2024